Full Transcript

From the TED Talk by Joy Buolamwini: How I'm fighting bias in algorithms

Unscramble the Blue Letters

Now you've seen in my examples how social rbotos was how I found out about exclusion with amihloigrtc bias. But algorithmic bias can also lead to discriminatory practices. Across the US, police denttermpas are sinrattg to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US — that's 117 million people — have their faces in facial recognition networks. Police departments can currently look at these networks ueturgeland, using algorithms that have not been atduied for accuracy. Yet we know facial recognition is not fail proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other polepe mbelalsied in our pothos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Open Cloze

Now you've seen in my examples how social ______ was how I found out about exclusion with ___________ bias. But algorithmic bias can also lead to discriminatory practices. Across the US, police ___________ are ________ to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US — that's 117 million people — have their faces in facial recognition networks. Police departments can currently look at these networks ___________, using algorithms that have not been _______ for accuracy. Yet we know facial recognition is not fail proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other ______ __________ in our ______. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Solution

  1. unregulated
  2. robots
  3. starting
  4. people
  5. mislabeled
  6. algorithmic
  7. photos
  8. departments
  9. audited

Original Text

Now you've seen in my examples how social robots was how I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices. Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US — that's 117 million people — have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Frequently Occurring Word Combinations

ngrams of length 2

collocation — frequency
algorithmic bias 6
facial recognition 6
training sets 5
code matters 4
recognition software 3
machine learning 3
start thinking 3
discriminatory practices 2
coded gaze 2
generic facial 2
computer vision 2
police departments 2
social change 2
inclusive training 2

ngrams of length 3

collocation — frequency
facial recognition software 3
generic facial recognition 2
inclusive training sets 2

ngrams of length 4

collocation — frequency
generic facial recognition software 2

Important Words

  1. accuracy
  2. adults
  3. algorithmic
  4. algorithms
  5. arsenal
  6. audited
  7. bias
  8. breaching
  9. challenge
  10. civil
  11. consistently
  12. criminal
  13. departments
  14. discriminatory
  15. examples
  16. exclusion
  17. facebook
  18. faces
  19. facial
  20. fail
  21. friends
  22. georgetown
  23. labeling
  24. laugh
  25. laughing
  26. law
  27. lead
  28. liberties
  29. matter
  30. million
  31. misidentifying
  32. mislabeled
  33. networks
  34. people
  35. photos
  36. police
  37. practices
  38. proof
  39. published
  40. recognition
  41. remains
  42. report
  43. robots
  44. showing
  45. social
  46. software
  47. starting
  48. suspected
  49. time
  50. unregulated